Projection-free nonconvex stochastic optimization on Riemannian manifolds


Abstract

We study stochastic projection-free methods for constrained optimization of smooth functions on Riemannian manifolds, i.e., with additional constraints beyond the parameter domain being a manifold. Specifically, we introduce stochastic Riemannian Frank–Wolfe (Fw) methods for nonconvex and geodesically convex problems. We present algorithms for both purely stochastic optimization and finite-sum problems. For the latter, we develop variance-reduced methods, including a Riemannian adaptation of the recently proposed Spider technique. In all settings, we recover convergence rates that are comparable to the best-known rates for their Euclidean counterparts. Finally, we discuss applications to two classic tasks: the computation of the Karcher mean of positive definite matrices and Wasserstein barycenters of multivariate normal distributions. For both tasks, our Fw methods yield state-of-the-art empirical performance.
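The Karcher-mean task mentioned in the abstract can be illustrated with a short sketch. The code below is not the paper's stochastic Frank–Wolfe method; it is a plain Riemannian gradient iteration for the Karcher (geometric) mean of symmetric positive definite (SPD) matrices under the affine-invariant metric, with illustrative names (`karcher_mean`, step size `eta`) chosen here for exposition.

```python
import numpy as np
from scipy.linalg import expm, logm, sqrtm

def karcher_mean(mats, eta=1.0, iters=50):
    """Riemannian gradient descent for the Karcher mean of SPD matrices
    under the affine-invariant metric. A deterministic sketch, not the
    paper's stochastic Frank-Wolfe algorithm."""
    X = sum(mats) / len(mats)              # arithmetic mean as initializer
    for _ in range(iters):
        Xh = sqrtm(X)                      # X^{1/2}
        Xih = np.linalg.inv(Xh)            # X^{-1/2}
        # Descent direction: average of log maps of the data at X
        G = sum(logm(Xih @ A @ Xih) for A in mats) / len(mats)
        # Exponential-map (geodesic) step along G
        X = Xh @ expm(eta * G) @ Xh
        X = np.real((X + X.T) / 2)         # re-symmetrize against roundoff
    return X
```

For commuting matrices the iteration reaches the closed-form answer `expm(mean(logm(A_i)))` in one step; in general it converges to the unique minimizer of the sum of squared geodesic distances.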


Related articles

Riemannian SVRG: Fast Stochastic Optimization on Riemannian Manifolds

We study optimization of finite sums of geodesically smooth functions on Riemannian manifolds. Although variance reduction techniques for optimizing finite sums have attracted tremendous attention in recent years, existing work is limited to vector space problems. We introduce Riemannian SVRG (RSVRG), a new variance reduced Riemannian optimization method. We analyze RSVRG for both geodesica...


Optimization Techniques on Riemannian Manifolds

The techniques and analysis presented in this paper provide new methods to solve optimization problems posed on Riemannian manifolds. A new point of view is offered for the solution of constrained optimization problems. Some classical optimization techniques on Euclidean space are generalized to Riemannian manifolds. Several algorithms are presented and their convergence properties are analyzed...


Averaging Stochastic Gradient Descent on Riemannian Manifolds

We consider the minimization of a function defined on a Riemannian manifold M accessible only through unbiased estimates of its gradients. We develop a geometric framework to transform a sequence of slowly converging iterates generated from stochastic gradient descent (SGD) on M to an averaged iterate sequence with a robust and fast O(1/n) convergence rate. We then present an application of our...


Variance-Reduced and Projection-Free Stochastic Optimization

The Frank-Wolfe optimization algorithm has recently regained popularity for machine learning applications due to its projection-free property and its ability to handle structured constraints. However, in the stochastic learning setting, it is still relatively understudied compared to the gradient descent counterpart. In this work, leveraging a recent variance reduction technique, we propose two...
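The projection-free property described above is easy to see in code: each Frank–Wolfe step solves a linear subproblem over the constraint set instead of projecting onto it. A minimal Euclidean sketch on the probability simplex (not the variance-reduced method this abstract proposes; `frank_wolfe_simplex` is an illustrative name):

```python
import numpy as np

def frank_wolfe_simplex(grad, x0, iters=500):
    """Minimal Frank-Wolfe sketch on the probability simplex.
    The linear subproblem argmin_{s in simplex} <grad, s> is solved
    exactly by a vertex, so no projection is ever needed."""
    x = x0.copy()
    for k in range(iters):
        g = grad(x)
        s = np.zeros_like(x)
        s[np.argmin(g)] = 1.0          # vertex minimizing the linearization
        gamma = 2.0 / (k + 2.0)        # classic diminishing step size
        x = (1 - gamma) * x + gamma * s
    return x
```

Because every iterate is a convex combination of simplex vertices, feasibility is maintained for free; this is the structural advantage that stochastic and variance-reduced variants inherit.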



Journal

Journal title: IMA Journal of Numerical Analysis

Year: 2021

ISSN: 1464-3642, 0272-4979

DOI: https://doi.org/10.1093/imanum/drab066